


Prioritizing energy intelligence for sustainable growth

MIT Technology Review

As AI drives extraordinary power demands, energy intelligence is rapidly becoming a core business metric. Loudoun County, Virginia, once known for its pastoral scenery and proximity to Washington, DC, has earned a more modern reputation in recent years: The area has the highest concentration of data centers on the planet. Ten years ago, these facilities powered email and e-commerce. Today, thanks to the meteoric rise in demand for AI-infused everything, local utility Dominion Energy is working hard to keep pace with surging power demands. The pressure is so acute that Dulles International Airport is constructing the largest airport solar installation in the country, a highly visible bid to bolster the region's power mix. Data center campuses like Loudoun's are cropping up across the country to accommodate an insatiable appetite for AI.



NeuralFuse: Learning to Recover the Accuracy of Access-Limited Neural Network Inference in Low-Voltage Regimes

Hao-Lun Sun

Neural Information Processing Systems

Energy-efficient computing is of primary importance to the effective deployment of deep neural networks (DNNs), particularly in edge devices and in on-chip AI systems. Increasing DNN computation's energy efficiency and lowering its carbon footprint require iterative efforts from both chip designers and algorithm developers.


Appendix Table of Contents

Neural Information Processing Systems

Our datasets and code are available via the following links: GitHub: https://github.com/NREL/BuildingsBench. As described in Sec. 3 and Sec. 4, Buildings-900K and the BuildingsBench benchmark datasets are

B.1 Motivation

Q: For what purpose was the dataset created? It specifically addresses a lack of appropriately sized and diverse datasets for pretraining STLF models. We emphasize that the EULP was not originally developed for studying STLF. Rather, it was developed as a general resource to "...help electric utilities, grid operators, manufacturers,

Q: Who created the dataset (e.g., which team, research group) and on behalf of which entity?

Q: Who funded the creation of the dataset?





The Technologies Changing How You'll Watch the 2026 Winter Olympic Games

WIRED

From drones with "first-person" visualization to real-time 360-degree replays and Olympics GPT, get ready to immerse yourself in the Winter Games in Milan and Cortina. During the 2024 Summer Olympics in Paris, 5G and 4K were the leading technologies available to many viewers. There was some AI, but it was mostly used for athletes' benefit. For the 2026 Milano Cortina Winter Games there will be more technology than ever, for both athletes and fans. Much of that technology has never been used at the Games before, says Yiannis Exarchos, the managing director of Olympic Broadcasting Services and executive director of Olympic Channel Services.


SpikedAttention: Training-Free and Fully Spike-Driven Transformer-to-SNN Conversion with Winner-Oriented Spike Shift for Softmax Operation

Neural Information Processing Systems

Event-driven spiking neural networks (SNNs) are promising neural networks that reduce the energy consumption of continuously growing AI models. Recently, keeping pace with the development of transformers, transformer-based SNNs have been presented.


Adder Attention for Vision Transformer

Neural Information Processing Systems

Transformer is a new calculation paradigm for deep learning that has shown strong performance on a large variety of computer vision tasks. However, compared with conventional deep models (e.g., convolutional neural networks), vision transformers require more computational resources and cannot be easily deployed on mobile devices. To this end, we propose to reduce energy consumption using adder neural networks (AdderNet). We first theoretically analyze the mechanism of self-attention and the difficulty of applying the adder operation to this module. Specifically, the feature diversity, i.e., the rank of the attention map, cannot be well preserved using only additions. Thus, we develop an adder attention layer that includes an additional identity mapping. With the new operation, vision transformers constructed using additions can also provide powerful feature representations. Experimental results on several benchmarks demonstrate that the proposed approach achieves performance highly competitive with that of the baselines while delivering an approximately 2–3× reduction in energy consumption.
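To make the mechanism concrete, here is a minimal NumPy sketch of attention built from additions: similarity between queries and keys is computed as a negative L1 distance (additions and comparisons only, no multiplications in the similarity itself), and an identity mapping is added to the output to help preserve feature diversity. All function names and shapes are hypothetical illustrations, not the paper's implementation.

```python
import numpy as np

def adder_attention(x, Wq, Wk, Wv):
    """Hypothetical sketch of an adder attention layer.

    x:  (n, d) input tokens
    Wq, Wk, Wv: (d, d) projection weights (square, so the
    identity mapping on x is shape-compatible with the output)
    """
    def adder_proj(x, W):
        # Adder-style projection: negative L1 distance between each
        # token and each weight row, instead of a matrix multiply.
        # x: (n, d), W: (d_out, d) -> (n, d_out)
        return -np.abs(x[:, None, :] - W[None, :, :]).sum(axis=-1)

    q = adder_proj(x, Wq)
    k = adder_proj(x, Wk)
    v = adder_proj(x, Wv)

    # Similarity via negative L1 distance between queries and keys,
    # replacing the usual dot product.
    sim = -np.abs(q[:, None, :] - k[None, :, :]).sum(axis=-1)  # (n, n)

    # Numerically stable row-wise softmax over the similarity scores.
    a = np.exp(sim - sim.max(axis=-1, keepdims=True))
    a /= a.sum(axis=-1, keepdims=True)

    # Attention output plus the identity mapping on the input,
    # the addition the abstract argues preserves feature rank.
    return a @ v + x
```

The residual `+ x` is the key design point: without it, an attention map built purely from additions tends to lose rank, so the identity path keeps the output features diverse.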